21 research outputs found

    Improved Measures of Integrated Information

    Although there is growing interest in measuring integrated information in computational and cognitive systems, current methods for doing so in practice are computationally infeasible. Existing and novel integration measures are investigated and classified by various desirable properties. A simple taxonomy of Φ-measures is presented in which each measure is characterized by its choice of factorization method (5 options), choice of probability distributions to compare (3 × 4 options), and choice of measure for comparing probability distributions (7 options). When the Φ-measures are required to satisfy a minimum of attractive properties, these hundreds of options reduce to a mere handful, some of which turn out to be identical. Useful exact and approximate formulas are derived that can be applied to real-world data from laboratory experiments without posing unreasonable computational demands.
    United States. Army Research Office (Grant W911NF-15-1-0300)
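    As a toy illustration of the ingredients such a taxonomy mixes and matches, the sketch below (a hypothetical minimal example, not one of the paper's actual Φ-measures) factorizes a two-unit system into the product of its marginals and compares the factorized distribution to the joint with KL divergence, one standard choice of comparison measure:

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p||q) in bits, one candidate comparison measure."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def phi_atomic(p_joint):
    """Toy Phi: compare the joint distribution of a 2-unit system against
    the product of its marginals (the 'atomic' factorization).
    p_joint is a 2x2 array over the states (x1, x2)."""
    p1 = p_joint.sum(axis=1)       # marginal of unit 1
    p2 = p_joint.sum(axis=0)       # marginal of unit 2
    p_prod = np.outer(p1, p2)      # factorized distribution
    return kl_divergence(p_joint.ravel(), p_prod.ravel())

# Perfectly correlated units: the factorization misses 1 bit
p = np.array([[0.5, 0.0], [0.0, 0.5]])
print(phi_atomic(p))   # -> 1.0

# Independent units carry no integrated information
q = np.outer([0.5, 0.5], [0.5, 0.5])
print(phi_atomic(q))   # -> 0.0
```

    The real taxonomy varies all three ingredients (factorization, distributions, comparison measure); this fixes one arbitrary choice of each purely for illustration.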

    Research Priorities for Robust and Beneficial Artificial Intelligence

    Artificial intelligence (AI) research has explored a variety of problems and approaches since its inception, but for the last 20 years or so has been focused on the problems surrounding the construction of intelligent agents — systems that perceive and act in some environment. In this context, the criterion for intelligence is related to statistical and economic notions of rationality — colloquially, the ability to make good decisions, plans, or inferences. The adoption of probabilistic representations and statistical learning methods has led to a large degree of integration and cross-fertilization between AI, machine learning, statistics, control theory, neuroscience, and other fields. The establishment of shared theoretical frameworks, combined with the availability of data and processing power, has yielded remarkable successes in various component tasks such as speech recognition, image classification, autonomous vehicles, machine translation, legged locomotion, and question-answering systems.

    An Improved Method for 21cm Foreground Removal

    21-cm tomography is expected to be difficult in part because of serious foreground contamination. Previous studies have found that line-of-sight approaches are capable of cleaning foregrounds to an acceptable level on large spatial scales, but not on small spatial scales. In this paper, we introduce a Fourier-space formalism for describing the line-of-sight methods, and use it to introduce an improved method for 21-cm foreground cleaning. Heuristically, this method involves fitting foregrounds in Fourier space using weighted polynomial fits, with each pixel weighted according to its information content. We show that the new method reproduces the old one on large angular scales, and gives marked improvements on small scales at essentially no extra computational cost.
    National Science Foundation (U.S.) (Grant AST-0134999)
    National Science Foundation (U.S.) (Grant AST-05-06556)
    David & Lucile Packard Foundation
    Research Corporation
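    The weighted-fit idea can be sketched for a single line of sight (hypothetical numbers throughout; this illustrates generic weighted polynomial fitting of a smooth foreground, not the paper's Fourier-space pipeline):

```python
import numpy as np

# Toy line of sight: a smooth power-law foreground dwarfs a faint
# fluctuating 21 cm-like component. Fit the smooth part with a weighted
# low-order polynomial in log-log space and subtract it.
rng = np.random.default_rng(0)
freq = np.linspace(100.0, 200.0, 64)              # MHz, line-of-sight axis
foreground = 5e3 * (freq / 150.0) ** (-2.6)       # smooth synchrotron-like foreground
signal = 0.01 * rng.standard_normal(freq.size)    # faint cosmological fluctuations
data = foreground + signal

weights = np.full(freq.size, 1.0)                 # uniform information content here;
                                                  # real weights vary per pixel
coeffs = np.polyfit(np.log(freq), np.log(data), deg=2, w=weights)
model = np.exp(np.polyval(coeffs, np.log(freq)))
residual = data - model                           # foreground-cleaned line of sight

print(residual.std() / foreground.mean())         # residuals are a tiny fraction
                                                  # of the foreground amplitude
```

    The paper's innovation is choosing those per-pixel weights from information content and doing the fits in Fourier space; here the weights are uniform purely to keep the example minimal.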

    Mapmaking for precision 21 cm cosmology

    In order to study the “Cosmic Dawn” and the Epoch of Reionization with 21 cm tomography, we need to statistically separate the cosmological signal from foregrounds known to be orders of magnitude brighter. Over the last few years, we have learned much about the role our telescopes play in creating a putatively foreground-free region called the “EoR window.” In this work, we examine how an interferometer’s effects can be taken into account in a way that allows for the rigorous estimation of 21 cm power spectra from interferometric maps while mitigating foreground contamination and thus increasing sensitivity. This requires a precise understanding of the statistical relationship between the maps we make and the underlying true sky. While some of these calculations would be computationally infeasible if performed exactly, we explore several well-controlled approximations that make mapmaking and the calculation of map statistics much faster, especially for compact and highly redundant interferometers designed specifically for 21 cm cosmology. We demonstrate the utility of these methods and the parametrized trade-offs between accuracy and speed using one such telescope, the upcoming Hydrogen Epoch of Reionization Array, as a case study.
    National Science Foundation (U.S.) (Grant AST-0457585)
    National Science Foundation (U.S.) (Grant AST-0821321)
    National Science Foundation (U.S.) (Grant AST-0804508)
    National Science Foundation (U.S.) (Grant AST-1105835)
    National Science Foundation (U.S.) (Grant AST-1125558)
    National Science Foundation (U.S.) (Grant AST-1129258)
    National Science Foundation (U.S.) (Grant AST-1410484)
    National Science Foundation (U.S.) (Grant AST-1411622)
    Mount Cuba Astronomical Association
    MIT School of Science
    Marble Astrophysics Fund
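    The "statistical relationship between the maps we make and the underlying true sky" is conventionally written as d = A s + n, with A the instrument response and n the noise; a minimal least-squares sketch of the corresponding map estimator (toy dimensions and a random response matrix, purely illustrative of the framework rather than the paper's approximations):

```python
import numpy as np

# Toy mapmaking: data d = A s + n; the optimal least-squares map is
# s_hat = (A^T N^-1 A)^-1 A^T N^-1 d. Forming and inverting these matrices
# exactly is what becomes infeasible at survey scale.
rng = np.random.default_rng(1)
n_pix, n_data = 8, 32
A = rng.standard_normal((n_data, n_pix))      # toy instrument response matrix
s = rng.standard_normal(n_pix)                # toy true sky
noise_var = 1e-4
d = A @ s + np.sqrt(noise_var) * rng.standard_normal(n_data)

Ninv = np.eye(n_data) / noise_var             # inverse noise covariance
s_hat = np.linalg.solve(A.T @ Ninv @ A, A.T @ Ninv @ d)

print(np.max(np.abs(s_hat - s)))              # small: the map recovers the sky
```

    The paper's contribution is a set of controlled approximations to these dense linear-algebra steps that exploit array compactness and redundancy; the exact small-scale version above is only the starting point.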

    Brute-Force Mapmaking with Compact Interferometers: A MITEoR Northern Sky Map from 128 MHz to 175 MHz

    We present a new method for interferometric imaging that is ideal for the large fields of view and compact arrays common in 21 cm cosmology. We first demonstrate the method with simulations of two very different low-frequency interferometers, the Murchison Widefield Array and the MIT Epoch of Reionization (MITEoR) experiment. We then apply the method to the MITEoR data set collected in 2013 July to obtain the first northern sky map from 128 to 175 MHz at ∼2° resolution and find an overall spectral index of −2.73 ± 0.11. The success of this imaging method bodes well for upcoming compact redundant low-frequency arrays such as the Hydrogen Epoch of Reionization Array. Both the MITEoR interferometric data and the 150 MHz sky map are available at http://space.mit.edu/home/tegmark/omniscope.html.
    National Science Foundation (U.S.) (AST-0908848)
    National Science Foundation (U.S.) (AST-1105835)
    National Science Foundation (U.S.) (AST-1440343)
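    A spectral index like the quoted −2.73 is, by the usual convention S(ν) ∝ ν^α, simply the slope of a straight-line fit in log(flux) versus log(frequency); a minimal sketch with mock, noiseless fluxes (hypothetical values, not the map's data):

```python
import numpy as np

# Mock fluxes following S(nu) = S_150 * (nu / 150 MHz)**alpha exactly,
# then recover alpha as the slope of a log-log straight-line fit.
freqs = np.array([128.0, 140.0, 150.0, 165.0, 175.0])   # MHz, within the map's band
alpha_true = -2.73                                      # the index reported for the map
flux = 100.0 * (freqs / 150.0) ** alpha_true            # arbitrary 100-unit normalization

slope, intercept = np.polyfit(np.log(freqs), np.log(flux), deg=1)
print(round(slope, 2))   # -> -2.73
```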

    Mapping our universe in 3D with MITEoR

    Mapping our universe in 3D by imaging the redshifted 21 cm line from neutral hydrogen has the potential to overtake the cosmic microwave background as our most powerful cosmological probe, because it can map a much larger volume of our Universe, shedding new light on the epoch of reionization, inflation, dark matter, dark energy, and neutrino masses. We report on MITEoR, a pathfinder low-frequency radio interferometer whose goal is to test technologies that greatly reduce the cost of such 3D mapping for a given sensitivity. MITEoR accomplishes this by using massive baseline redundancy both to enable automated precision calibration and to cut the correlator cost scaling from N² to N log N, where N is the number of antennas. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious HERA project, which incorporates many identical or similar technologies using an order of magnitude more antennas, each with dramatically larger collecting area.
    National Science Foundation (U.S.) (Grant AST-0908848)
    National Science Foundation (U.S.) (Grant AST-1105835)
    MIT Kavli Instrumentation Fund
    Massachusetts Institute of Technology. Undergraduate Research Opportunities Program
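    The quoted scalings can be made concrete with a toy operation count (illustrative only, ignoring constant factors): a general correlator must cross-multiply every antenna pair, while a maximally redundant grid of antennas lets the spatial correlation be performed with an FFT:

```python
import math

def pairwise_cost(n):
    """Number of antenna pairs a generic correlator must form: O(N^2)."""
    return n * (n - 1) // 2

def fft_cost(n):
    """FFT-style operation count for a fully redundant grid: O(N log N)."""
    return int(n * math.log2(n))

# The gap widens rapidly with array size:
for n in (64, 1024, 10**5):
    print(n, pairwise_cost(n), fft_cost(n), round(pairwise_cost(n) / fft_cost(n), 1))
```

    For MITEoR's 64 elements the savings are modest; for HERA-class and larger arrays the N²-to-N log N reduction dominates the correlator budget, which is the point of the redundancy.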

    Observing the Evolution of the Universe

    How did the universe evolve? The fine angular scale (l > 1000) temperature and polarization anisotropies in the CMB are a Rosetta stone for understanding the evolution of the universe. Through detailed measurements one may address everything from the physics of the birth of the universe to the history of star formation and the process by which galaxies formed. One may in addition track the evolution of the dark energy and discover the net neutrino mass. We are at the dawn of a new era in which hundreds of square degrees of sky can be mapped with arcminute resolution and sensitivities measured in microkelvin. Acquiring these data requires the use of special-purpose telescopes such as the Atacama Cosmology Telescope (ACT), located in Chile, and the South Pole Telescope (SPT). These new telescopes are outfitted with a new generation of custom mm-wave kilo-pixel arrays. Additional instruments are in the planning stages.
    Comment: Science White Paper submitted to the US Astro2010 Decadal Survey. Full list of 177 authors available at http://cmbpol.uchicago.ed

    Low frequency observations of linearly polarized structures in the interstellar medium near the south Galactic pole

    This is an author-created, un-copyedited version of an article published in The Astrophysical Journal. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at https://doi.org/10.3847/0004-637X/830/1/38
    We present deep polarimetric observations at 154 MHz with the Murchison Widefield Array (MWA), covering 625 deg^2 centered on RA = 0 h, Dec = -27 deg. The sensitivity available in our deep observations allows an in-band, frequency-dependent analysis of polarized structure for the first time at long wavelengths. Our analysis suggests that the polarized structures are dominated by intrinsic emission but may also have a foreground Faraday screen component. At these wavelengths, the compactness of the MWA baseline distribution provides excellent snapshot sensitivity to large-scale structure. The observations are sensitive to diffuse polarized emission at ~54' resolution with a sensitivity of 5.9 mJy beam^-1 and to compact polarized sources at ~2.4' resolution with a sensitivity of 2.3 mJy beam^-1 for a subset (400 deg^2) of this field. The sensitivity allows the effect of ionospheric Faraday rotation to be spatially and temporally measured directly from the diffuse polarized background. Our observations reveal large-scale structures (~1 deg - 8 deg in extent) in linear polarization clearly detectable in ~2 minute snapshots, which would remain undetectable by interferometers with minimum baseline lengths >110 m at 154 MHz. The brightness temperature of these structures is on average 4 K in polarized intensity, peaking at 11 K. Rotation measure synthesis reveals that the structures have Faraday depths ranging from -2 rad m^-2 to 10 rad m^-2, with a large fraction peaking at ~+1 rad m^-2. We estimate a distance of 51 +/- 20 pc to the polarized emission based on measurements of the in-field pulsar J2330-2005. We detect four extragalactic linearly polarized point sources within the field in our compact source survey. Based on the known polarized source population at 1.4 GHz and non-detections at 154 MHz, we estimate an upper limit on the depolarization ratio of 0.08 from 1.4 GHz to 154 MHz.
    Peer reviewed. Final Accepted Version.
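    Rotation measure synthesis, used above to locate the Faraday depths, is essentially a Fourier transform of the complex polarization against wavelength squared; a self-contained toy (single Faraday screen, idealized noiseless band, made-up sampling) that recovers a depth near the observed ~+1 rad m^-2 peak:

```python
import numpy as np

# RM synthesis toy: a single Faraday screen at depth phi_true produces
# P(lambda^2) = exp(2j * phi_true * lambda^2); transforming against trial
# depths phi and finding the peak of |F(phi)| recovers phi_true.
c = 299.792458e6                        # speed of light, m/s
freqs = np.linspace(128e6, 175e6, 256)  # Hz, roughly the band of these observations
lam2 = (c / freqs) ** 2                 # wavelength squared, m^2

phi_true = 1.0                          # rad m^-2, near the observed ~+1 peak
P = np.exp(2j * phi_true * lam2)        # polarized signal from one Faraday screen

phis = np.linspace(-20, 20, 801)        # trial Faraday depths, rad m^-2
F = np.array([np.mean(P * np.exp(-2j * phi * lam2)) for phi in phis])
print(phis[np.argmax(np.abs(F))])       # -> ~1.0, recovering phi_true
```

    Real data add noise, multiple Faraday components, and the finite lambda^2 coverage that sets the Faraday-depth resolution; this toy shows only the core transform.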

    Critical Behavior in Physics and Probabilistic Formal Languages

    We show that the mutual information between two symbols, as a function of the number of symbols between the two, decays exponentially in any probabilistic regular grammar, but can decay like a power law for a context-free grammar. This result about formal languages is closely related to a well-known result in classical statistical mechanics that there are no phase transitions in dimensions fewer than two. It is also related to the emergence of power-law correlations in turbulence and cosmological inflation through recursive generative processes. We elucidate these physics connections and comment on potential applications of our results to machine learning tasks like training artificial recurrent neural networks. Along the way, we introduce a useful quantity, which we dub the rational mutual information, and discuss generalizations of our claims involving more complicated Bayesian networks.
    Keywords: formal languages; statistical mechanics; criticality
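    The regular-grammar half of the claim can be checked numerically in the simplest case, a two-state Markov chain (a toy stand-in for a probabilistic regular grammar, with made-up transition probabilities):

```python
import numpy as np

# Mutual information I(X_0; X_d) between symbols d steps apart in a
# stationary two-state Markov chain. The decay is governed by the
# second eigenvalue of the transition matrix.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])              # transition matrix; eigenvalues 1 and 0.7
pi = np.array([2/3, 1/3])               # stationary distribution (pi @ T == pi)

def mutual_information(T, pi, d):
    """I(X_0; X_d) in bits for the stationary chain."""
    Td = np.linalg.matrix_power(T, d)   # d-step transition probabilities
    joint = pi[:, None] * Td            # P(X_0 = i, X_d = j)
    indep = np.outer(pi, pi)            # product of marginals
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / indep[mask])))

mis = [mutual_information(T, pi, d) for d in (1, 2, 4, 8)]
print(mis)   # strictly decreasing, roughly geometric in d
```

    The decay here is exponential, set by the second eigenvalue (0.7); the paper's contrast is that context-free (tree-like) generative processes can instead produce power-law decay.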

    Consciousness as a state of matter

    We examine the hypothesis that consciousness can be understood as a state of matter, “perceptronium”, with distinctive information-processing abilities. We explore four basic principles that may distinguish conscious matter from other physical systems such as solids, liquids and gases: the information, integration, independence and dynamics principles. If such principles can identify conscious entities, then they can help solve the quantum factorization problem: why do conscious observers like us perceive the particular Hilbert space factorization corresponding to classical space (rather than Fourier space, say), and more generally, why do we perceive the world around us as a dynamic hierarchy of objects that are strongly integrated and relatively independent? Tensor factorization of matrices is found to play a central role, and our technical results include a theorem about Hamiltonian separability (defined using Hilbert–Schmidt superoperators) being maximized in the energy eigenbasis. Our approach generalizes Giulio Tononi’s integrated information framework for neural-network-based consciousness to arbitrary quantum systems, and we find interesting links to error-correcting codes, condensed matter criticality, and the Quantum Darwinism program, as well as an interesting connection between the emergence of consciousness and the emergence of time.
    National Science Foundation (U.S.) (AST-0908848)
    National Science Foundation (U.S.) (AST-1105835)